10 research outputs found

    A two-directional 1-gram visual motion sensor inspired by the fly's eye

    No full text
    Optic flow-based autopilots for Micro-Aerial Vehicles (MAVs) need lightweight, low-power sensors to be able to fly safely through unknown environments. The tiny 6-pixel visual motion sensor presented here meets these demanding requirements in terms of mass, size and power consumption. This 1-gram, low-power, fly-inspired sensor accurately gauges visual motion using only its 6-pixel array, as tested with two different panoramas under various illuminance conditions. The sensor's output results from a smart combination of the information collected by several 2-pixel Local Motion Sensors (LMSs), based on the "time of travel" scheme originally inspired by the common housefly's Elementary Motion Detector (EMD) neurons. The proposed sensory fusion method enables the sensor to measure the visual angular speed and determine the main direction of the visual motion without any prior knowledge. By computing the median value of the outputs of several LMSs, we also ended up with a more robust, more accurate and more frequently refreshed measurement of the 1-D angular speed.
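    A minimal sketch of the "time of travel" principle described above, assuming two thresholded photoreceptor signals separated by a known inter-receptor angle (the variable names, threshold logic and numeric values are illustrative, not the authors' implementation):

```python
DELTA_PHI_DEG = 4.0   # assumed inter-receptor angle between two adjacent pixels (deg)
THRESHOLD = 0.5       # assumed contrast threshold on the photoreceptor signals

def first_crossing(samples, dt):
    """Time at which a sampled photoreceptor signal first exceeds THRESHOLD."""
    for i, s in enumerate(samples):
        if s >= THRESHOLD:
            return i * dt
    return None

def lms_angular_speed(ph1, ph2, dt):
    """2-pixel Local Motion Sensor: a contrast feature sweeping across two
    neighboring photoreceptors triggers two threshold crossings; the angular
    speed is the inter-receptor angle divided by the travel time, and the
    sign of the travel time gives the direction of motion."""
    t1, t2 = first_crossing(ph1, dt), first_crossing(ph2, dt)
    if t1 is None or t2 is None or t1 == t2:
        return None  # no measurable motion
    return DELTA_PHI_DEG / (t2 - t1)  # positive: ph1 -> ph2; negative: ph2 -> ph1
```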

    Toward a fully autonomous hovercraft visually guided thanks to its own bio-inspired motion sensors

    No full text
    Based on a biorobotic approach developed in our laboratory over the past 25 years, we have designed and built several terrestrial and aerial vehicles that control their position and speed on the basis of optic flow cues. In particular, in our project on the autonomous guidance of Micro-Air Vehicles (MAVs) in confined indoor and outdoor environments, we have developed a vision-based autopilot called LORA III (Lateral Optic flow Regulation Autopilot, Mark III). This autopilot, based on dual optic flow regulation, allows an air vehicle to travel along a corridor by automatically controlling both its speed and its clearance from the walls. Optic flow regulation is a feedback control, based on an optic flow sensor, which strives to maintain the perceived optic flow at a constant set-point by adjusting a thrust. The LORA III autopilot consists of a dual optic flow regulator in which each regulator has its own optic flow set-point and controls the robot's translation along one degree of freedom: a bilateral optic flow regulator controls the robot's forward speed, while a unilateral optic flow regulator controls the side thrust, making the robot avoid the walls of the corridor. This autopilot draws on former studies which aimed to understand how a honeybee might be able to center along a corridor, follow a single wall, and adjust its speed according to the corridor width. Computer-simulated experiments have shown that a miniature hovercraft equipped with the LORA III autopilot can navigate along a straight or tapered corridor at a relatively high speed (up to 1 m/s). This minimalistic visual system (comprising only four pixels) may suffice for the hovercraft to control both its clearance from the walls and its forward speed jointly, without ever measuring speed or distance, in a similar manner to what honeybees are thought to be capable of. The LORA robot is equipped with two rear thrusters and two lateral thrusters, in addition to the lift fan used to inflate the skirt. The hovercraft can move freely without any umbilicus, which makes both its system identification and its locomotion easier. However, the dynamics of all five motors turned out to be highly sensitive to the drop in supply voltage of the onboard Lithium Polymer (Li-Po) batteries, a critical issue for the identification of the robot's dynamical parameters. To perform an efficient system identification of the hovercraft's dynamics, we therefore gave each motor a dedicated controller that makes the four thrusters and the lift fan robust to any variations in the battery supply voltage.
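    A schematic rendering of the dual optic flow regulation principle, with hypothetical gains and set-points (the actual LORA III controller dynamics are not reproduced here):

```python
def dual_of_regulator(of_left, of_right,
                      setpoint_fwd=2.0, setpoint_side=1.5,
                      k_fwd=0.1, k_side=0.1):
    """One step of a dual optic flow regulation scheme.

    of_left, of_right: measured lateral optic flows (rad/s).
    Returns (forward_thrust_cmd, side_thrust_cmd).
    Gains and set-points are illustrative placeholders.
    """
    # Bilateral regulator: the sum of both lateral OFs drives the forward thrust.
    forward_cmd = k_fwd * (setpoint_fwd - (of_left + of_right))

    # Unilateral regulator: the larger lateral OF (generated by the nearer wall)
    # drives the side thrust, pushing the robot away from that wall.
    if of_left >= of_right:
        side_cmd = +k_side * (of_left - setpoint_side)   # drift right, away from the left wall
    else:
        side_cmd = -k_side * (of_right - setpoint_side)  # drift left, away from the right wall
    return forward_cmd, side_cmd
```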

    A fully-autonomous hovercraft inspired by bees: wall following and speed control in straight and tapered corridors

    No full text
    The small autonomous vehicles of the future will have to navigate close to obstacles in highly unpredictable environments. Risky tasks of this kind may require novel sensors and control methods that differ from conventional approaches. Recent ethological findings have shown that complex navigation tasks such as obstacle avoidance and speed control are performed by flying insects on the basis of optic flow (OF) cues, although insects' compound eyes have a very poor spatial resolution. The present paper deals with the implementation of an optic flow-based autopilot on a fully autonomous hovercraft. Tests were performed on this small (878-gram) innovative robotic platform in straight and tapered corridors lined with natural panoramas. A bilateral OF regulator controls the robot's forward speed (up to 0.8 m/s), while a unilateral OF regulator controls the robot's clearance from the two walls. A micro-gyrometer and a tiny magnetic compass ensure that the hovercraft travels forward in the corridor without yawing. The lateral OFs are measured by two minimalist eyes mounted sideways, opposite to each other. For the first time, the hovercraft was found to be capable of adjusting both its forward speed and its clearance from the walls, in both straight and tapered corridors, without requiring any distance or speed measurements, that is, without any need for on-board rangefinders or tachometers.
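    The key relation behind this result is that a sideways-looking eye translating at speed v at distance D from a wall perceives an optic flow of magnitude v/D, so regulating OF constrains only the ratio of speed to clearance, never either quantity alone. A toy closed-loop sketch under this assumption (gains, set-points and the point-mass dynamics are invented for illustration):

```python
def lateral_optic_flow(v, d):
    """Translational optic flow (rad/s) seen by a sideways-looking eye
    moving at speed v (m/s) at distance d (m) from a wall: omega = v / d."""
    return v / d

def simulate_corridor(width, y0=0.3, v0=0.1, dt=0.05, steps=2000,
                      sp_fwd=2.0, sp_side=1.2, k_fwd=0.05, k_side=0.02):
    """Toy point-mass vehicle between the two walls of a corridor of the
    given width (m); neither speed nor distance is ever measured directly."""
    y, v = y0, v0   # distance from the left wall, forward speed
    for _ in range(steps):
        of_l = lateral_optic_flow(v, y)
        of_r = lateral_optic_flow(v, width - y)
        v += dt * k_fwd * (sp_fwd - (of_l + of_r))   # bilateral OF regulator
        if of_l >= of_r:
            y += dt * k_side * (of_l - sp_side)      # unilateral OF regulator:
        else:                                        # drift away from the
            y -= dt * k_side * (of_r - sp_side)      # nearer wall
        y = min(max(y, 0.05), width - 0.05)          # keep the toy model off the walls
    return v, y   # in a tapered corridor, the settled v scales with the local width
```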

    Insect inspired visual motion sensing and flying robots

    Get PDF
    Flying insects are remarkably good at visual motion sensing, using dedicated motion processing circuits at low energy and computational cost. Drawing on observations of insect visual guidance, we developed visual motion sensors and bio-inspired autopilots dedicated to flying robots. Optic flow-based visuomotor control systems have been implemented on an increasingly large number of sighted autonomous robots. In this chapter, we present how we designed and constructed local motion sensors and how we implemented bio-inspired visual guidance schemes on board several micro-aerial vehicles. A hyperacute sensor, in which retinal micro-scanning movements are performed via a small piezo-bender actuator, was mounted onto a miniature aerial robot. The OSCAR II robot is able to track a moving target accurately by exploiting the micro-scanning movement imposed on its eye's retina. We also present two interdependent control schemes, one driving the eye's angular position in the robot's frame and the other driving the robot's body angular position with respect to a visual target, without any knowledge of the robot's orientation in the global frame. This "steering-by-gazing" control strategy, implemented on this lightweight (100 g) miniature sighted aerial robot, demonstrates the effectiveness of this biomimetic visual/inertial heading control strategy.
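    A rough sketch of the "steering-by-gazing" idea: a fast gaze loop cancels the target's retinal position error, while a slower heading loop steers the body toward wherever the eye points, so only angles measured in the robot's own frame are needed (gains and signal names are invented for illustration):

```python
def steering_by_gazing_step(retinal_error, eye_in_robot_angle,
                            k_gaze=0.8, k_heading=0.2):
    """One control step of the two interdependent loops.

    retinal_error: target position error on the retina (rad).
    eye_in_robot_angle: eye orientation in the body frame (rad).
    Returns (eye_rate_cmd, body_yaw_rate_cmd); gains are placeholders.
    Neither loop uses the robot's orientation in the global frame.
    """
    # Fast loop: rotate the eye so the target stays centered on the retina.
    eye_rate_cmd = k_gaze * retinal_error
    # Slow loop: yaw the body to realign it with the gaze direction,
    # driving the eye-in-robot angle back toward zero.
    body_yaw_rate_cmd = k_heading * eye_in_robot_angle
    return eye_rate_cmd, body_yaw_rate_cmd
```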

    Honeybees' Speed Depends on Dorsal as Well as Lateral, Ventral and Frontal Optic Flows

    Get PDF
    Flying insects use the optic flow to navigate safely in unfamiliar environments, especially by adjusting their speed and their clearance from surrounding objects. It has not yet been established, however, which specific parts of the optic flow field insects use to control their speed. With a view to answering this question, freely flying honeybees were trained to fly along a specially designed tunnel comprising two successive tapering parts: the first tapered in the vertical plane and the second in the horizontal plane. The honeybees were found to adjust their speed on the basis of the optic flow they perceived not only in the lateral and ventral parts of their visual field, but also in the dorsal part. More specifically, the honeybees' speed varied monotonically with the minimum cross-section of the tunnel, regardless of whether the narrowing occurred in the horizontal or vertical plane: their speed decreased or increased whenever the minimum cross-section decreased or increased. In other words, the larger sum of the two opposite optic flows in the horizontal and vertical planes was kept practically constant, thanks to the speed control performed by the honeybees upon encountering a narrowing of the tunnel. The previously described ALIS (“AutopiLot using an Insect-based vision System”) model nicely matches the present behavioral findings. The ALIS model is based on a feedback control scheme that explains how honeybees may keep their speed proportional to the minimum local cross-section of a tunnel, based solely on optic flow processing, without any need for speedometers or rangefinders. The present behavioral findings suggest how flying insects may succeed in adjusting their speed in their complex foraging environments, while at the same time adjusting their distance not only from lateral and ventral objects but also from those located in their dorsal visual field.
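    A minimal numeric rendition of the ALIS speed-control idea: each optic flow is speed divided by distance, the controller holds the larger of the two opposite-pair OF sums at a set-point, and the equilibrium speed then comes out proportional to the smaller tunnel dimension (the set-point, gain and centered-flight assumption are illustrative):

```python
def alis_speed_step(v, width, height, sp=3.0, k=0.05, dt=0.05):
    """One step of an ALIS-like speed regulator for an agent centered in a
    tunnel: each lateral OF is v / (width / 2), each vertical OF is
    v / (height / 2), and the larger opposite-pair sum is regulated."""
    sum_horizontal = 2 * v / (width / 2)   # left + right optic flows
    sum_vertical = 2 * v / (height / 2)    # ventral + dorsal optic flows
    max_sum = max(sum_horizontal, sum_vertical)
    return v + dt * k * (sp - max_sum)     # speed up if OF too low, slow down if too high

# At equilibrium max_sum == sp, hence v == sp * min(width, height) / 4:
# the speed is proportional to the minimum local cross-section of the tunnel.
```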

    Interpolation-based “time of travel” scheme in a Visual Motion Sensor using a small 2D retina

    No full text
    Abstract—Insects' flying abilities based on optic flow (OF) are attractive bio-inspired models for Micro Aerial Vehicles (MAVs) endowed with limited computational power. Most OF-sensing robots developed so far have used numerically complex algorithms requiring large computational resources, often run offline. The present study shows the performance of our bio-inspired Visual Motion Sensor (VMS) based on a 3x4 matrix of auto-adaptive aVLSI photoreceptors belonging to a custom-made bio-inspired chip called APIS (Adaptive Pixels for Insect-based Sensors). To achieve such processing with the limited computational power of a tiny microcontroller (µC), the µC-based implementation of the “time of travel” scheme, which requires at least a 1 kHz sampling rate, was modified by linearly interpolating the photoreceptor signals so that the algorithm could run at a lower sampling rate. The accuracy of the measurements was assessed in simulation for various sampling rates, and the best trade-off between computational load and accuracy, found at 200 Hz, was implemented onboard a tiny µC. By interpolating the photoreceptor signals and fusing the outputs of several Local Motion Sensors (LMSs), we ended up with an accurate and frequently refreshed VMS measuring a visual angular speed while requiring more than 4 times fewer computational resources.
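    A sketch of the interpolation trick: rather than sampling at 1 kHz, the threshold-crossing instant of each photoreceptor signal is located between two samples taken at a lower rate by linear interpolation (function names, threshold and values are illustrative):

```python
def interpolated_crossing_time(samples, dt, threshold=0.5):
    """Sub-sample estimate of when a rising signal crosses `threshold`,
    obtained by linear interpolation between the two bracketing samples."""
    for i in range(1, len(samples)):
        lo, hi = samples[i - 1], samples[i]
        if lo < threshold <= hi:
            frac = (threshold - lo) / (hi - lo)   # crossing position within the step
            return (i - 1 + frac) * dt
    return None

# At a 200 Hz sampling rate (dt = 5 ms), the crossing below is located at
# 7.5 ms, i.e. with far finer resolution than the 5 ms sample period.
t = interpolated_crossing_time([0.1, 0.3, 0.7, 0.9], dt=0.005)
```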

    A novel 1-gram insect based device measuring visual motion along 5 optical directions

    No full text
    Autopilots for micro aerial vehicles (MAVs) with a maximum permissible avionic payload of only a few grams need lightweight, low-power sensors to be able to navigate safely when flying through unknown environments. To meet these demanding specifications, we developed a simple functional model of an Elementary Motion Detector (EMD) circuit based on the common housefly's visual system. During the last two decades, several insect-based visual motion sensors have been designed and implemented on various robots, and considerable improvements have been made in terms of their mass, size and power consumption. The new lightweight visual motion sensor presented here generates 5 simultaneous neighboring measurements of the 1-D angular speed of a natural scene within a measurement range of more than one decade [25°/s; 350°/s]. Using a new sensory fusion method consisting of computing the median value of the 5 local motion units, we ended up with a more robust, more accurate and more frequently refreshed measurement of the 1-D angular speed.
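    The median-based fusion mentioned above fits in two lines; with 5 local motion units, the median tolerates up to two aberrant local readings (the values below are made up):

```python
import statistics

# Five simultaneous local angular-speed measurements (deg/s); two are outliers.
local_units = [118.0, 122.0, 350.0, 119.5, 25.0]
fused = statistics.median(local_units)   # 119.5 deg/s: the outliers are rejected
```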